A Batch Rival Penalized Expectation-Maximization Algorithm for Gaussian Mixture Clustering with Automatic Model Selection

Authors

  • Jiechang Wen
  • Dan Zhang
  • Yiu-ming Cheung
  • Hailin Liu
  • Xinge You
Abstract

Within the maximum weighted likelihood (MWL) learning framework proposed by Cheung (2004, 2005), this paper develops a batch Rival Penalized Expectation-Maximization (RPEM) algorithm for density mixture clustering, applicable when all observations are available before the learning process. Compared with the adaptive RPEM algorithm of Cheung (2004, 2005), the batch RPEM does not require a learning rate, analogous to the Expectation-Maximization (EM) algorithm (Dempster et al., 1977), yet it still preserves the capability of automatic model selection. Furthermore, the batch RPEM generally converges faster than both the EM and the adaptive RPEM algorithms. Experiments on synthetic data and color image segmentation demonstrate the superior performance of the proposed algorithm.
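To make the idea concrete, the sketch below shows one plausible way a batch, rival-penalizing EM iteration for a Gaussian mixture could look in Python. It is only an illustration under assumed design choices: the weight form g = (1 + eta) * 1[winner] - eta * h, the clipping of negative weights, the penalization strength eta, and the pruning threshold prune_tol are assumptions made for this sketch, not the paper's exact update equations.

```python
import numpy as np
from scipy.stats import multivariate_normal

def batch_rpem_sketch(X, k_init=8, eta=0.1, prune_tol=1e-3, n_iter=100, seed=0):
    """Illustrative batch RPEM-style Gaussian mixture clustering.

    A hypothetical simplification: the rival-penalizing weights, the clipping
    of negative weights, and the pruning rule are assumptions of this sketch,
    not the paper's exact update equations.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Start with more components than the expected number of clusters.
    means = X[rng.choice(n, k_init, replace=False)].astype(float)
    covs = np.array([np.cov(X.T) + 1e-3 * np.eye(d) for _ in range(k_init)])
    alphas = np.full(k_init, 1.0 / k_init)

    for _ in range(n_iter):
        k = len(alphas)
        # E-step: posterior responsibilities h(j | x_t).
        dens = np.column_stack([
            alphas[j] * multivariate_normal.pdf(X, means[j], covs[j])
            for j in range(k)
        ])
        h = dens / np.maximum(dens.sum(axis=1, keepdims=True), 1e-300)

        # Rival-penalized weights: reward the winning component, penalize the
        # rest (assumed form g = (1 + eta) * 1[winner] - eta * h).
        winners = h.argmax(axis=1)
        g = -eta * h
        g[np.arange(n), winners] += 1.0 + eta

        # Batch M-step-like update driven by the (clipped) weights.
        w = np.clip(g, 0.0, None)
        wsum = w.sum(axis=0)
        alphas = np.clip(g.sum(axis=0), 0.0, None)
        alphas /= alphas.sum()
        for j in range(k):
            if wsum[j] > 0:
                means[j] = (w[:, j, None] * X).sum(axis=0) / wsum[j]
                diff = X - means[j]
                covs[j] = (w[:, j, None, None] *
                           np.einsum('ti,tj->tij', diff, diff)).sum(axis=0) / wsum[j]
                covs[j] += 1e-6 * np.eye(d)  # keep covariances well conditioned

        # Automatic model selection: discard components whose mixing
        # proportion has been driven close to zero by the rival penalty.
        keep = alphas > prune_tol
        alphas = alphas[keep] / alphas[keep].sum()
        means, covs = means[keep], covs[keep]

    return alphas, means, covs
```

Because learning starts with extra components and the rival penalty keeps shrinking the mixing proportions of redundant ones until they are pruned, the number of clusters is determined during parameter estimation rather than fixed in advance, and no learning rate needs to be tuned.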


Similar Articles

Expectation-MiniMax: A General Penalized Competitive Learning Approach to Clustering Analysis

In the literature, the Rival Penalized Competitive Learning (RPCL) algorithm (Xu et al., 1993) and its variants perform clustering analysis well without knowing the cluster number. However, this penalization scheme was proposed heuristically, without theoretical guidance. In this paper, we propose a general penalized competitive learning approach named Expectation-MiniMax (EMM) Learning that...
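For readers unfamiliar with RPCL, the minimal Python sketch below illustrates the winner-reward / rival-penalization idea the snippet refers to; the learning and de-learning rates, the conscience weighting, and the initial number of centres are illustrative choices, not the settings of Xu et al. (1993).

```python
import numpy as np

def rpcl_sketch(X, k_init=8, lr_win=0.05, lr_rival=0.002, n_epochs=50, seed=0):
    """Minimal RPCL-style clustering sketch (parameters are illustrative).

    For each sample, the nearest centre (winner) moves towards it and the
    second-nearest centre (rival) is pushed slightly away, so redundant
    centres drift out of the data and the cluster number need not be known.
    """
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), k_init, replace=False)].astype(float)
    wins = np.ones(k_init)  # winning counts for the frequency "conscience"

    for _ in range(n_epochs):
        for x in X[rng.permutation(len(X))]:
            # Frequency-weighted distances discourage one centre from
            # winning everything.
            gamma = wins / wins.sum()
            d2 = gamma * ((centres - x) ** 2).sum(axis=1)
            winner, rival = np.argsort(d2)[:2]
            centres[winner] += lr_win * (x - centres[winner])
            centres[rival] -= lr_rival * (x - centres[rival])
            wins[winner] += 1

    return centres
```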


High dimensional Sparse Gaussian Graphical Mixture Model

This paper considers the problem of network reconstruction from heterogeneous data using a Gaussian Graphical Mixture Model (GGMM). It is well known that parameter estimation in this context is challenging due to the large number of variables coupled with the degenerate nature of the likelihood. We propose as a solution a penalized maximum likelihood technique that imposes an l1 penalty on the pre...
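As a small, generic illustration of the kind of l1-penalized precision-matrix estimation referred to here (for a single Gaussian rather than the paper's mixture model), one can use scikit-learn's graphical lasso; the toy data and the penalty value alpha below are arbitrary choices, not those of the paper.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Toy data: three variables where x2 depends on x0 and x1 is independent.
rng = np.random.default_rng(0)
x0 = rng.normal(size=500)
x1 = rng.normal(size=500)
x2 = 0.8 * x0 + 0.3 * rng.normal(size=500)
X = np.column_stack([x0, x1, x2])

# l1-penalized maximum likelihood for the precision matrix; a larger alpha
# drives more off-diagonal entries (absent edges) exactly to zero.
model = GraphicalLasso(alpha=0.05).fit(X)
print(np.round(model.precision_, 2))
```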


Expectation-MiniMax Approach to Clustering Analysis

This paper proposes a general approach named Expectation-MiniMax (EMM) for clustering analysis without knowing the cluster number. It approximates the contrast function of the Expectation-Maximization (EM) algorithm by one with a designable error term. By adaptively minimizing a specific error term while maximizing the approximate contrast function, the EMM automatically penaliz...


Data smoothing regularization, multi-sets-learning, and problem solving strategies

First, we briefly introduce the basic idea of data smoothing regularization, which was first proposed by Xu [Brain-like computing and intelligent information systems (1997) 241] for parameter learning in a way similar to Tikhonov regularization, but with an easy solution to the difficulty of determining an appropriate hyper-parameter. The roles of this regularization are also demonstrated on ...
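Since data smoothing regularization is described as being in the spirit of Tikhonov regularization, the short sketch below shows plain Tikhonov-regularized least squares for orientation only; it is not the data smoothing method itself, and the hyper-parameter lam is set arbitrarily here rather than by the automatic rule the paper refers to.

```python
import numpy as np

def tikhonov_ls(A, b, lam):
    """Tikhonov-regularized least squares: argmin_x ||Ax - b||^2 + lam * ||x||^2."""
    d = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)

# Toy usage on an ill-conditioned design matrix.
rng = np.random.default_rng(0)
A = rng.normal(size=(30, 10))
A[:, 1] = A[:, 0] + 1e-3 * rng.normal(size=30)   # nearly collinear columns
b = A @ np.ones(10) + 0.1 * rng.normal(size=30)
print(tikhonov_ls(A, b, lam=1.0).round(2))
```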


Regularized Parameter Estimation in High-Dimensional Gaussian Mixture Models

Finite Gaussian mixture models are widely used in statistics thanks to their great flexibility. However, parameter estimation for Gaussian mixture models with high dimensionality can be challenging because of the large number of parameters that need to be estimated. In this letter, we propose a penalized likelihood estimator to address this difficulty. The [Formula: see text]-type penalty we im...
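The letter's exact penalty is not reproduced in the truncated abstract, so the snippet below only illustrates the general need for regularization when fitting Gaussian mixtures in high dimensions, using scikit-learn's built-in ridge-like covariance regularizer (reg_covar) together with a diagonal covariance structure; these are substitute devices for illustration, not the letter's estimator.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# High-dimensional toy data: 210 samples, 50 features, 3 clusters.
rng = np.random.default_rng(0)
centres = rng.normal(scale=4.0, size=(3, 50))
X = np.vstack([c + rng.normal(size=(70, 50)) for c in centres])

# A ridge-like floor on the covariances (reg_covar) plus a diagonal
# covariance structure keeps the estimate well conditioned when the
# dimension is large relative to the sample size.
gmm = GaussianMixture(n_components=3, covariance_type="diag",
                      reg_covar=1e-2, random_state=0).fit(X)
print(gmm.weights_.round(2))
```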




Journal:

Volume 2012, Issue

Pages -

Publication date: 2012